The effect of non-optimal bases on the convergence of Krylov subspace methods
Authors
Valeria Simoncini, Daniel B. Szyld
Abstract
There are many examples where non-orthogonality of a basis for Krylov subspace methods arises naturally. Such methods usually require less storage or computational effort per iteration than methods using an orthonormal basis (optimal methods), but their convergence may be delayed. Truncated Krylov subspace methods and other non-optimal methods have been shown to converge in many situations, often with small delay, but not in others. We explore the question of what the effect of a non-optimal basis is. We prove certain identities for the relative residual gap, i.e., the relative difference between the residuals of the optimal and non-optimal methods. These identities and related bounds provide insight into when the delay is small and convergence is achieved. Further understanding is gained by using a recently developed general theory of superlinear convergence. Our analysis confirms the observation that in exact arithmetic the orthogonality of the basis is not important; what matters is maintaining linear independence. Numerical examples illustrate our theoretical results.
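As a rough illustration of the quantities discussed in the abstract, the following NumPy sketch compares plain GMRES (full Arnoldi orthogonalization, an optimal method) with a truncated variant that orthogonalizes each new basis vector against only the last few previous ones (a non-optimal, DQGMRES-like method), and prints a relative residual gap between the two. The function name `arnoldi_gmres`, the random test matrix, and the truncation length are illustrative choices, not taken from the paper.

```python
import numpy as np

def arnoldi_gmres(A, b, m, trunc=None):
    """Run m Arnoldi steps with full or truncated orthogonalization and
    return the true residual norms of the (quasi-)minimal residual iterates.

    trunc=None -> full modified Gram-Schmidt sweep: optimal GMRES.
    trunc=k    -> orthogonalize only against the last k basis vectors:
                  a non-optimal, truncated (DQGMRES-like) method.
    """
    n = len(b)
    V = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    beta = np.linalg.norm(b)
    V[:, 0] = b / beta
    res = []
    for j in range(m):
        w = A @ V[:, j]
        start = 0 if trunc is None else max(0, j - trunc + 1)
        for i in range(start, j + 1):          # (possibly truncated) MGS sweep
            H[i, j] = V[:, i] @ w
            w -= H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-14:                # lucky breakdown
            break
        V[:, j + 1] = w / H[j + 1, j]
        # Quasi-minimal residual iterate: minimize ||beta*e1 - H_bar y||.
        # With the orthonormal (full) basis this equals the true residual
        # norm; with the truncated basis it is only an estimate, so the
        # iterate is formed explicitly and the true residual is measured.
        e1 = np.zeros(j + 2); e1[0] = beta
        y, *_ = np.linalg.lstsq(H[:j + 2, :j + 1], e1, rcond=None)
        x = V[:, :j + 1] @ y
        res.append(np.linalg.norm(b - A @ x))
    return np.array(res)

# Toy example: a mildly nonsymmetric, well-conditioned matrix.
rng = np.random.default_rng(0)
n = 200
A = np.eye(n) + 0.5 * rng.standard_normal((n, n)) / np.sqrt(n)
b = rng.standard_normal(n)

r_opt   = arnoldi_gmres(A, b, m=60)            # optimal (full GMRES)
r_trunc = arnoldi_gmres(A, b, m=60, trunc=5)   # non-optimal (truncated basis)

# Relative gap between the residuals of the non-optimal and optimal methods.
m = min(len(r_opt), len(r_trunc))
gap = (r_trunc[:m] - r_opt[:m]) / np.linalg.norm(b)
for j in range(0, m, 10):
    print(f"step {j:3d}: ||r_opt|| = {r_opt[j]:.2e}, "
          f"||r_trunc|| = {r_trunc[j]:.2e}, gap = {gap[j]:.2e}")
```

In this setting the truncated method typically tracks the optimal one with only a small delay, which is the behaviour the paper's identities and bounds aim to explain.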
Similar articles
Preconditioned Generalized Minimal Residual Method for Solving Fractional Advection-Diffusion Equation
Fractional differential equations (FDEs) have attracted much attention and have been widely used in finance, physics, image processing, biology, and other fields. It is not always possible to find an analytical solution for such equations. An approximate solution or numerical scheme may be a good approach; in particular, schemes from numerical linear algebra for solving ...
New variants of the global Krylov type methods for linear systems with multiple right-hand sides arising in elliptic PDEs
In this paper, we present new variants of global bi-conjugate gradient (Gl-BiCG) and global bi-conjugate residual (Gl-BiCR) methods for solving nonsymmetric linear systems with multiple right-hand sides. These methods are based on global oblique projections of the initial residual onto a matrix Krylov subspace. It is shown that these new algorithms converge faster and more smoothly than the Gl-...
On the Superlinear Convergence of Exact and Inexact Krylov Subspace Methods
We present a general analytical model which describes the superlinear convergence of Krylov subspace methods. We take an invariant subspace approach, so that our results apply also to inexact methods, and to non-diagonalizable matrices. Thus, we provide a unified treatment of the superlinear convergence of GMRES, Conjugate Gradients, block versions of these, and inexact subspace methods. Numeri...
Accelerating Convergence by Augmented Rayleigh-Ritz Projections For Large-Scale Eigenpair Computation
Iterative algorithms for large-scale eigenpair computation are mostly based on subspace projections consisting of two main steps: a subspace update (SU) step that generates bases for approximate eigenspaces, followed by a Rayleigh-Ritz (RR) projection step that extracts approximate eigenpairs. A predominant methodology for the SU step makes use of Krylov subspaces that build orthonormal bases pie...
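For orientation, here is a minimal, generic sketch of the SU + RR pattern described in that entry: a simple subspace (block power) iteration as the update step, followed by a plain Rayleigh-Ritz extraction. It is not the augmented RR procedure of the cited paper; the test matrix, block size, and iteration count are made-up illustrative choices.

```python
import numpy as np

def rayleigh_ritz(A, V):
    """Generic Rayleigh-Ritz step: given symmetric A and an orthonormal
    basis V of a subspace, return Ritz values and Ritz vectors."""
    H = V.T @ A @ V                     # small projected matrix
    theta, S = np.linalg.eigh(H)        # eigenpairs of the projected matrix
    return theta, V @ S                 # Ritz values, Ritz vectors

rng = np.random.default_rng(0)
n, k = 500, 6
# Diagonal test matrix with three well-separated dominant eigenvalues.
A = np.diag(np.concatenate([np.linspace(1.0, 10.0, n - 3), [100.0, 200.0, 300.0]]))

# Subspace update (SU) step: blocked power iterations, re-orthonormalized
# with a thin QR factorization.
V, _ = np.linalg.qr(rng.standard_normal((n, k)))
for _ in range(20):
    V, _ = np.linalg.qr(A @ V)

theta, X = rayleigh_ritz(A, V)
# The largest Ritz values should closely approximate 100, 200, 300.
print("Ritz values:", np.round(theta[-3:], 6))
print("Residual norms:", [float(np.linalg.norm(A @ X[:, i] - theta[i] * X[:, i]))
                          for i in range(k - 3, k)])
```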
On the Occurrence of Superlinear Convergence of Exact and Inexact Krylov Subspace Methods
Krylov subspace methods often exhibit superlinear convergence. We present a general analytic model which describes this superlinear convergence, when it occurs. We take an invariant subspace approach, so that our results apply also to inexact methods, and to non-diagonalizable matrices. Thus, we provide a unified treatment of the superlinear convergence of GMRES, Conjugate Gradients, block vers...
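The superlinear behaviour these two entries analyse can be reproduced with a small experiment: run plain conjugate gradients on a symmetric positive definite matrix whose spectrum is a tight cluster plus a few large outliers, and watch the per-step residual reduction factor improve once the outliers are effectively resolved. This is only a toy illustration of the phenomenon, not the invariant-subspace model of the cited papers; the spectrum and iteration count are arbitrary choices.

```python
import numpy as np

def cg_residuals(A, b, maxit):
    """Plain conjugate gradients on an SPD matrix; returns ||r_k|| / ||r_0||."""
    x = np.zeros_like(b)
    r = b.copy(); p = r.copy()
    hist = [np.linalg.norm(r)]
    for _ in range(maxit):
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)
        x += alpha * p
        r_new = r - alpha * Ap
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p
        r = r_new
        hist.append(np.linalg.norm(r))
    return np.array(hist) / hist[0]

# Spectrum: a tight cluster in [1, 2] plus a handful of large outliers.
rng = np.random.default_rng(1)
eigs = np.concatenate([1.0 + rng.random(200), [50.0, 100.0, 400.0]])
A = np.diag(eigs)                        # diagonal SPD test matrix
b = rng.standard_normal(len(eigs))

res = cg_residuals(A, b, maxit=20)
# The per-step reduction factor shrinks after the first few iterations,
# once the outlying eigenvalues have effectively been "deflated".
for k in range(1, len(res)):
    print(f"iter {k:2d}: ||r_k||/||r_0|| = {res[k]:.2e}, "
          f"factor = {res[k] / res[k - 1]:.3f}")
```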
Journal: Numerische Mathematik
Volume: 100, Issue: -
Pages: -
Year of publication: 2005